Transfer entropy is a non-parametric statistic measuring the amount of directed (time-asymmetric) transfer of information between two random processes. Transfer entropy from a process ''X'' to another process ''Y'' is the amount of uncertainty reduced in future values of ''Y'' by knowing the past values of ''X'' given past values of ''Y''. More specifically, if <math>X_t</math> and <math>Y_t</math> for <math>t \in \mathbb{N}</math> denote two random processes and the amount of information is measured using Shannon entropy, the transfer entropy can be written as:

:<math>T_{X \rightarrow Y} = H\left(Y_t \mid Y_{t-1:t-L}\right) - H\left(Y_t \mid Y_{t-1:t-L},\, X_{t-1:t-L}\right),</math>

where ''H''(''X'') is the Shannon entropy of ''X''. The above definition of transfer entropy has been extended by other types of entropy measures, such as Rényi entropy.

Transfer entropy is a conditional mutual information, with the history of the influenced variable in the condition:

:<math>T_{X \rightarrow Y} = I\left(Y_t \,;\, X_{t-1:t-L} \mid Y_{t-1:t-L}\right).</math>

Transfer entropy reduces to Granger causality for vector autoregressive processes. Hence, it is advantageous when the model assumption of Granger causality does not hold, for example in the analysis of non-linear signals. However, it usually requires more samples for accurate estimation. While it was originally defined for bivariate analysis, transfer entropy has been extended to multivariate forms, either conditioning on other potential source variables or considering transfer from a collection of sources, although these forms again require more samples. Transfer entropy has been used for the estimation of functional connectivity of neurons and of social influence in social networks.
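As an illustration of the definition above, the following is a minimal sketch of a plug-in (histogram) estimator of transfer entropy for discrete-valued series, written in Python. The function name <code>transfer_entropy</code> and the history-length parameter <code>k</code> are hypothetical choices for this sketch, not an established library interface; the estimator simply counts joint occurrences and applies the formula directly.

<syntaxhighlight lang="python">
import numpy as np
from collections import Counter

def transfer_entropy(x, y, k=1, base=2):
    """Plug-in estimate of T_{X->Y} for discrete-valued series x, y,
    using the last k values of each process as the history.

    Implements T_{X->Y} = sum_{y_t, y_past, x_past} p(y_t, y_past, x_past)
                          * log[ p(y_t | y_past, x_past) / p(y_t | y_past) ].
    """
    x, y = np.asarray(x), np.asarray(y)
    joint = Counter()  # counts of (y_t, y_past, x_past)
    for t in range(k, len(y)):
        joint[(y[t], tuple(y[t - k:t]), tuple(x[t - k:t]))] += 1
    total = sum(joint.values())

    # Marginal counts needed for the two conditional probabilities.
    c_yp_xp, c_yt_yp, c_yp = Counter(), Counter(), Counter()
    for (yt, yp, xp), c in joint.items():
        c_yp_xp[(yp, xp)] += c
        c_yt_yp[(yt, yp)] += c
        c_yp[yp] += c

    te = 0.0
    for (yt, yp, xp), c in joint.items():
        p_full = c / c_yp_xp[(yp, xp)]         # p(y_t | y_past, x_past)
        p_self = c_yt_yp[(yt, yp)] / c_yp[yp]  # p(y_t | y_past)
        te += (c / total) * np.log(p_full / p_self)
    return te / np.log(base)

# Demo: y copies x with a one-step lag, so about one bit of
# information is transferred from X to Y and none in reverse.
rng = np.random.default_rng(0)
x = rng.integers(0, 2, 100_000)
y = np.roll(x, 1)
print(transfer_entropy(x, y))  # ~ 1.0 bit
print(transfer_entropy(y, x))  # ~ 0.0 bits
</syntaxhighlight>

Note that this plug-in estimate is positively biased for short series, which reflects the point above that transfer entropy usually requires more samples than model-based measures for accurate estimation.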
== See also ==
* Conditional mutual information
* Causality
* Causality (physics)
* Structural equation modeling
* Rubin causal model
* Mutual information